Ludwig Eduard Boltzmann (20 February 1844 – 5 September 1906) was an Austrian mathematician and theoretical physicist. His greatest achievements were the development of statistical mechanics and the statistical explanation of the second law of thermodynamics. In 1877 he provided the current definition of entropy, $S = k_{\mathrm{B}} \ln \Omega$, where Ω is the number of microstates whose energy equals the system's energy, interpreted as a measure of the statistical disorder of a system.
Statistical mechanics is one of the pillars of modern physics. It describes how macroscopic observations (such as temperature and pressure) are related to microscopic parameters that fluctuate around an average. It connects thermodynamic quantities (such as heat capacity) to microscopic behavior, whereas, in classical thermodynamics, the only available option would be to measure and tabulate such quantities for various materials.
Starting in 1863, Boltzmann studied mathematics and physics at the University of Vienna. He received his doctorate in 1866 and his venia legendi in 1869. Boltzmann worked closely with Josef Stefan, director of the institute of physics. It was Stefan who introduced Boltzmann to Maxwell's work.
In 1872, long before women were admitted to Austrian universities, he met Henriette von Aigentler, an aspiring teacher of mathematics and physics in Graz. She was refused permission to audit lectures unofficially. Boltzmann supported her decision to appeal, which was successful. On 17 July 1876 Ludwig Boltzmann married Henriette; they had three daughters: Henriette (1880), Ida (1884) and Else (1891); and a son, Arthur Ludwig (1881). Boltzmann went back to Graz to take up the chair of Experimental Physics. Among his students in Graz were Svante Arrhenius and Walther Nernst. He spent 14 happy years in Graz and it was there that he developed his statistical concept of nature.
Boltzmann was appointed to the Chair of Theoretical Physics at the University of Munich in Bavaria, Germany in 1890.
In 1894, Boltzmann succeeded his teacher Josef Stefan as Professor of Theoretical Physics at the University of Vienna.
In Vienna, Boltzmann taught physics and also lectured on philosophy. Boltzmann's lectures on natural philosophy were very popular and received considerable attention. His first lecture was an enormous success: people stood all the way down the staircase outside the largest available lecture hall, and the Emperor invited him to a reception (The Boltzmann Equation: Theory and Applications, E. G. D. Cohen and W. Thirring, eds., Springer Science & Business Media, 2012).
In 1905, he gave an invited course of lectures in the summer session at the University of California, Berkeley, which he described in a popular essay, "A German professor's trip to El Dorado".
In May 1906, Boltzmann's deteriorating mental condition (described in a letter by the Dean as "a serious form of neurasthenia") forced him to resign his position. His symptoms indicate he experienced what might today be diagnosed as bipolar disorder. Four months later, on 5 September 1906, he died by suicide, hanging himself while on vacation with his wife and daughter in Duino, near Trieste (then part of Austria) (Muir, Hazel, Eureka! Science's greatest thinkers and their key breakthroughs, p. 152). Upon Boltzmann's death, Friedrich ("Fritz") Hasenöhrl became his successor in the professorial chair of physics at Vienna. He is buried in the Viennese Zentralfriedhof. His tombstone bears the inscription of Boltzmann's entropy formula: $S = k \cdot \log W$.
Boltzmann wrote treatises on philosophy such as "On the question of the objective existence of processes in inanimate nature" (1897). He was a realist (Cercignani, Carlo, Ludwig Boltzmann: The Man Who Trusted Atoms). In his work "On a Thesis of Schopenhauer's", Boltzmann refers to his philosophy as materialism and says further: "Idealism asserts that only the ego exists, the various ideas, and seeks to explain matter from them. Materialism starts from the existence of matter and seeks to explain sensations from it."
He made multiple attempts to explain the second law of thermodynamics, with approaches ranging over many areas. He tried Helmholtz's monocycle model, a pure ensemble approach like that of Gibbs, a purely mechanical approach like ergodic theory, the combinatorial argument, the molecular-chaos assumption, and others.
Most chemists, since the discoveries of John Dalton in 1808, and James Clerk Maxwell in Scotland and Josiah Willard Gibbs in the United States, shared Boltzmann's belief in atoms and molecules, but much of the physics establishment did not share this belief until decades later. Boltzmann had a long-running dispute with the editor of the preeminent German physics journal of his day, who refused to let Boltzmann refer to atoms and molecules as anything other than convenient theoretical constructs. Only a couple of years after Boltzmann's death, Perrin's studies of colloidal suspensions (1908–1909), based on Einstein's theoretical studies of 1905, confirmed the values of the Avogadro constant and the Boltzmann constant, convincing the world that the tiny particles really exist.
To quote Max Planck, "The connection between entropy and probability was first stated by L. Boltzmann in his kinetic theory of gases" (Max Planck, p. 119). This famous formula for entropy S is

$$S = k_{\mathrm{B}} \ln W,$$

where $k_{\mathrm{B}}$ is the Boltzmann constant and ln is the natural logarithm. (The concept of entropy itself had been introduced by Rudolf Clausius in 1865; he was the first to enunciate the second law of thermodynamics by saying that "entropy always increases".) W (for Wahrscheinlichkeit, a German word meaning "probability") is the probability of occurrence of a macrostate or, more precisely, the number of possible microstates corresponding to the macroscopic state of a system – the number of (unobservable) "ways" in which the (observable) thermodynamic state of a system can be realized by assigning different positions and momenta to the various molecules. Boltzmann's paradigm was an ideal gas of N identical particles, of which $N_i$ are in the i-th microscopic condition (range) of position and momentum. W can be counted using the formula for permutations

$$W = \frac{N!}{\prod_i N_i!},$$

where i ranges over all possible molecular conditions and ! denotes the factorial. The "correction" in the denominator accounts for indistinguishable particles in the same condition.
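As a minimal numerical illustration (not from Boltzmann's own writings; the occupation numbers below are arbitrary), the multiplicity W and the corresponding entropy $S = k_{\mathrm{B}} \ln W$ can be computed directly for a toy system:

```python
import math

# Toy illustration: count the microstates W for N identical particles
# distributed over a few position/momentum "bins" with occupation numbers N_i,
# then evaluate Boltzmann's entropy S = k_B * ln(W).

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by the 2019 SI definition)

def multiplicity(occupations):
    """W = N! / (N_1! * N_2! * ...) for the given occupation numbers N_i."""
    n_total = sum(occupations)
    w = math.factorial(n_total)
    for n_i in occupations:
        w //= math.factorial(n_i)
    return w

occupations = [3, 2, 1]          # arbitrary example: 6 particles in 3 bins
W = multiplicity(occupations)    # 6! / (3! * 2! * 1!) = 60
S = K_B * math.log(W)            # entropy of this macrostate

print(f"W = {W}, S = {S:.3e} J/K")
```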
Boltzmann could also be considered one of the forerunners of quantum mechanics due to his suggestion in 1877 that the energy levels of a physical system could be discrete, although Boltzmann used this as a mathematical device with no physical meaning (Boltzmann, Ludwig (1877), "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium", Sitzungsberichte der Kaiserlichen Akademie der Wissenschaften, Mathematisch-Naturwissenschaftliche Classe, Part II, LXXVI, pp. 373–435, Vienna; reprinted in Wissenschaftliche Abhandlungen, Vol. II, reprint 42, pp. 164–223, Barth, Leipzig, 1909; translated by K. Sharp and F. Matschinsky in Entropy 2015, 17, 1971–2009).
An alternative to Boltzmann's formula for entropy, above, is the information entropy definition introduced in 1948 by Claude Shannon. Shannon's definition was intended for use in communication theory but is applicable in all areas. It reduces to Boltzmann's expression when all the probabilities are equal, but can, of course, be used when they are not. Its virtue is that it yields immediate results without resorting to factorials or Stirling's approximation. Similar formulas are found, however, as far back as the work of Boltzmann, and explicitly in Gibbs (see reference).
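To make the stated reduction concrete, here is the one-line check (the notation is chosen here for illustration): when all W microstates are equally probable, $p_i = 1/W$, Shannon's entropy in natural-log units collapses to Boltzmann's expression,

$$H = -\sum_{i=1}^{W} p_i \ln p_i = -\sum_{i=1}^{W} \frac{1}{W}\ln\frac{1}{W} = \ln W, \qquad S = k_{\mathrm{B}} H = k_{\mathrm{B}} \ln W.$$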
The Boltzmann equation describes the time and space variation of the probability distribution for the position and momentum of a density distribution of a cloud of points in single-particle phase space. (See Hamiltonian mechanics.) The first term on the left-hand side represents the explicit time variation of the distribution function, while the second term gives the spatial variation, and the third term describes the effect of any force acting on the particles. The right-hand side of the equation represents the effect of collisions.
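The equation itself is not reproduced in the text above; a commonly quoted form, consistent with the term-by-term description, is

$$\frac{\partial f}{\partial t} + \mathbf{v}\cdot\nabla_{\mathbf{x}} f + \frac{\mathbf{F}}{m}\cdot\nabla_{\mathbf{v}} f = \left(\frac{\partial f}{\partial t}\right)_{\mathrm{coll}},$$

where $f(\mathbf{x}, \mathbf{v}, t)$ is the single-particle distribution function, $\mathbf{F}$ is the force acting on the particles, $m$ is the particle mass, and the right-hand side is the collision term.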
In principle, the above equation completely describes the dynamics of an ensemble of gas particles, given appropriate boundary conditions. This first-order differential equation has a deceptively simple appearance, since f can represent an arbitrary single-particle distribution function. Also, the force F acting on the particles depends directly on the velocity distribution function f. The Boltzmann equation is notoriously difficult to integrate. David Hilbert spent years trying to solve it without any real success.
The form of the collision term assumed by Boltzmann was approximate. However, for an ideal gas the standard Chapman–Enskog solution of the Boltzmann equation is highly accurate. It is expected to lead to incorrect results for an ideal gas only under shock wave conditions.
Boltzmann tried for many years to "prove" the second law of thermodynamics using his gas-dynamical equation – his famous H-theorem. However, the key assumption he made in formulating the collision term was "molecular chaos", an assumption which breaks time-reversal symmetry, as is necessary for anything which could imply the second law. It was from the probabilistic assumption alone that Boltzmann's apparent success emanated, so his long dispute with Loschmidt and others over Loschmidt's paradox ultimately ended in his failure.
Finally, in the 1970s E. G. D. Cohen and J. R. Dorfman proved that a systematic (power series) extension of the Boltzmann equation to high densities is mathematically impossible. Consequently, nonequilibrium statistical mechanics for dense gases and liquids focuses on the Green–Kubo relations, the fluctuation theorem, and other approaches instead.
In particular, Boltzmann attempted to reduce the second law to a stochastic collision function, or law of probability, following from the random collisions of mechanical particles. Following Maxwell (Maxwell, J. (1871), Theory of Heat, London: Longmans, Green & Co.), Boltzmann modeled gas molecules as colliding billiard balls in a box, noting that with each collision nonequilibrium velocity distributions (groups of molecules moving at the same speed and in the same direction) would become increasingly disordered, leading to a final state of macroscopic uniformity and maximum microscopic disorder, the state of maximum entropy (where the macroscopic uniformity corresponds to the obliteration of all field potentials or gradients) (Boltzmann, L. (1974), "The second law of thermodynamics", Populäre Schriften, Essay 3, address to a formal meeting of the Imperial Academy of Science, 29 May 1886; reprinted in Ludwig Boltzmann, Theoretical Physics and Philosophical Problems, S. G. Brush (Trans.), Boston: Reidel; original work published 1886). The second law, he argued, was thus simply the result of the fact that in a world of mechanically colliding particles disordered states are the most probable. Because there are so many more possible disordered states than ordered ones, a system will almost always be found either in the state of maximum disorder – the macrostate with the greatest number of accessible microstates, such as a gas in a box at equilibrium – or moving towards it. A dynamically ordered state, one with molecules moving "at the same speed and in the same direction", Boltzmann concluded, is thus "the most improbable case conceivable ... an infinitely improbable configuration of energy" (Boltzmann, L. (1974), The second law of thermodynamics, p. 20).
Boltzmann accomplished the feat of showing that the second law of thermodynamics is only a statistical fact. The gradual disordering of energy is analogous to the disordering of an initially ordered pack of cards under repeated shuffling, and just as the cards will finally return to their original order if shuffled a gigantic number of times, so the entire universe must someday regain, by pure chance, the state from which it first set out. (This optimistic coda to the idea of the dying universe becomes somewhat muted when one attempts to estimate the timeline which will probably elapse before it spontaneously occurs.) (Collier's Encyclopedia, Volume 19, Phyfe to Reni, "Physics", by David Park, p. 15.) The tendency for entropy to increase seems to cause difficulty for beginners in thermodynamics, but is easy to understand from the standpoint of the theory of probability. Consider two ordinary dice, with both sixes face up. After the dice are shaken, the chance of finding these two sixes face up is small (1 in 36); thus one can say that the random motion (the agitation) of the dice, like the chaotic collisions of molecules caused by thermal energy, causes the less probable state to change to one that is more probable. With millions of dice, like the millions of atoms involved in thermodynamic calculations, the probability of their all being sixes becomes so vanishingly small that the system must move to one of the more probable states (Collier's Encyclopedia, Volume 22, Sylt to Uruguay, "Thermodynamics", by Leo Peters, p. 275).
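A small numerical sketch (illustrative only, not part of the cited encyclopedia text) shows how quickly the "all sixes" state becomes improbable as the number of dice grows, mirroring why a thermodynamic system essentially never revisits a highly ordered state:

```python
from fractions import Fraction

# Probability that every die in a shaken set of n fair dice shows a six,
# i.e. (1/6)**n.  Two dice give 1/36; for "thermodynamic" numbers of dice
# (molecules) the probability is effectively zero.

def prob_all_sixes(n_dice):
    """Exact probability that all n_dice fair dice land on six."""
    return Fraction(1, 6) ** n_dice

for n in (2, 10, 100):
    p = prob_all_sixes(n)
    print(f"{n:>3} dice: P(all sixes) = {float(p):.3e}")
```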
It was only after experiments, such as Jean Perrin's studies of colloidal suspensions, confirmed the values of the Avogadro constant and the Boltzmann constant that the existence of atoms and molecules gained wider acceptance. Boltzmann's kinetic theory played a crucial role in demonstrating the reality of atoms and molecules and explaining various phenomena in gases, liquids, and solids.
Max Planck later named the constant $k_{\mathrm{B}}$ the Boltzmann constant in honor of Boltzmann's contributions to statistical mechanics. The Boltzmann constant is now a fundamental constant in physics and appears across many scientific disciplines.
Quantization of energy levels became a fundamental postulate in quantum mechanics, leading to groundbreaking theories like quantum electrodynamics and quantum field theory. Thus, Boltzmann's early insights into the quantization of energy levels had a profound influence on the development of quantum physics.